
    Strong inapproximability of the shortest reset word

    The Černý conjecture states that every $n$-state synchronizing automaton has a reset word of length at most $(n-1)^2$. We study the hardness of finding short reset words. It is known that the exact version of the problem, i.e., finding the shortest reset word, is NP-hard and coNP-hard, and complete for the DP class, and that approximating the length of the shortest reset word within a factor of $O(\log n)$ is NP-hard [Gerbush and Heeringa, CIAA'10], even for the binary alphabet [Berlinkov, DLT'13]. We significantly improve on these results by showing that, for every $\epsilon>0$, it is NP-hard to approximate the length of the shortest reset word within a factor of $n^{1-\epsilon}$. This is essentially tight since a simple $O(n)$-approximation algorithm exists.
    Comment: extended abstract to appear in MFCS 201
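    For intuition, the simple $O(n)$-approximation mentioned above can be realized by the classical greedy pair-merging strategy: while the current image of the state set has more than one state, pick two of its states and append a shortest word merging them, found by BFS over state pairs. Each such merging word is no longer than the shortest reset word (which also merges that pair), and at most $n-1$ merges are needed, giving an $(n-1)$-approximation. The sketch below is an illustrative Python implementation under assumed conventions (transitions given as delta[letter][state]); it is not code from the paper.

from collections import deque

def shortest_merging_word(delta, alphabet, p, q):
    """BFS over unordered state pairs: shortest word w with w(p) == w(q)."""
    start = (min(p, q), max(p, q))
    parent = {start: None}                    # pair -> (previous pair, letter)
    queue = deque([start])
    while queue:
        pair = queue.popleft()
        if pair[0] == pair[1]:                # the two states have been merged
            word = []
            while parent[pair] is not None:
                pair, letter = parent[pair]
                word.append(letter)
            return word[::-1]
        for letter in alphabet:
            nxt = (delta[letter][pair[0]], delta[letter][pair[1]])
            nxt = (min(nxt), max(nxt))
            if nxt not in parent:
                parent[nxt] = (pair, letter)
                queue.append(nxt)
    return None                               # this pair can never be merged

def greedy_reset_word(delta, n, alphabet):
    """Greedy (n-1)-approximation: each merge costs at most OPT letters."""
    image, word = set(range(n)), []
    while len(image) > 1:
        p, q = sorted(image)[:2]              # any two states of the current image
        merge = shortest_merging_word(delta, alphabet, p, q)
        if merge is None:
            return None                       # the automaton is not synchronizing
        word += merge
        for letter in merge:                  # apply the merging word to the image
            image = {delta[letter][s] for s in image}
    return word

# Illustrative use on a small 3-state automaton over letters 'a' and 'b'.
delta = {'a': [1, 2, 0], 'b': [1, 1, 2]}
print(greedy_reset_word(delta, 3, 'ab'))      # one valid output: ['b', 'a', 'a', 'b']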

    Design of dimensional model for clinical data storage and analysis

    Current research in the life and medical sciences generates large volumes of data on a daily basis. It has thus become necessary to find solutions for storing this data efficiently and for correlating it and extracting knowledge from it. Clinical data generated in hospitals, clinics and diagnostic centres falls under the same paradigm. Patient records in hospitals are growing at an exponential rate, adding to the problem of data management and storage. A major storage problem is the varied dimensionality of the data, which ranges from images to numerical values. There is therefore a need for an efficient data model that can handle this multi-dimensionality and store the data together with its history. To address this problem in clinical informatics, we propose a clinical dimensional model design that can be used to develop a clinical data mart. The model is designed for temporal storage of patient data across all relevant clinical parameters, covering both textual and image-based data. The availability of this data for each patient can then be exploited by data mining techniques to find correlations among the parameters at the level of both the individual and the population.
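    As a rough illustration of what such a dimensional design can look like, the sketch below sets up a generic star schema with patient, parameter and time dimensions around a single observation fact table. It is a hypothetical example, not the schema proposed in the paper, and all table and column names are illustrative assumptions.

import sqlite3

# Hypothetical star schema for a clinical data mart: one observation fact table
# keyed to patient, parameter and time dimensions. Numeric values, free text and
# references to image data live side by side in the fact table.
schema = """
CREATE TABLE dim_patient   (patient_key   INTEGER PRIMARY KEY, mrn TEXT, sex TEXT, birth_date TEXT);
CREATE TABLE dim_parameter (parameter_key INTEGER PRIMARY KEY, name TEXT, unit TEXT, modality TEXT);
CREATE TABLE dim_time      (time_key      INTEGER PRIMARY KEY, visit_date TEXT, year INTEGER, month INTEGER);
CREATE TABLE fact_observation (
    patient_key   INTEGER REFERENCES dim_patient(patient_key),
    parameter_key INTEGER REFERENCES dim_parameter(parameter_key),
    time_key      INTEGER REFERENCES dim_time(time_key),
    numeric_value REAL,     -- numerical measurements (e.g. lab results)
    text_value    TEXT,     -- free-text findings
    image_uri     TEXT      -- pointer to image data kept outside the mart
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)
print([row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")])

    Keeping numeric values, free text and image references in one fact table keyed by a time dimension is one simple way to combine multi-dimensional and historical storage in a single mart.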

    Evaluation of antioxidant and antimutagenic potential of Justicia adhatoda leaves extract

    In this study, the ethanolic extract of Justicia adhatoda (Acanthaceae) leaves was prepared by a successive extraction procedure in order of increasing polarity. To our knowledge, no antimutagenic evaluation of this plant has been reported. Our aim in the present study was to determine the antioxidant and antimutagenic potential of different fractions of the ethanolic extract of J. adhatoda. Ultra-high performance liquid chromatography (UHPLC) analysis revealed the presence of polyphenolic compounds and flavonoids, which might be responsible for the bioprotective activity. Among the five fractions (hexane, chloroform, ethyl acetate, n-butanol and aqueous), the n-butanol and ethyl acetate fractions exhibited significant antioxidant activity with the lowest IC50 values (< 105.33 μg/ml), whereas the hexane, chloroform and aqueous fractions exhibited excellent antimutagenic potential against 2-aminofluorene for S. typhimurium TA98 and TA100 strains in the presence of S9 mix. These results indicate that these fractions warrant further research into their potential chemopreventive effects.
    Key words: Justicia adhatoda, antioxidant, antimutagenic, ultra-high performance liquid chromatography (UHPLC), IC50 value

    Stable Frank-Kasper phases of self-assembled, soft matter spheres

    Single molecular species can self-assemble into Frank-Kasper (FK) phases, finite approximants of dodecagonal quasicrystals, defying intuitive notions that thermodynamic ground states are maximally symmetric. FK phases are speculated to emerge as the minimal-distortion packings of space-filling spherical domains, but a precise quantification of this distortion, and of how it affects assembly thermodynamics, remains lacking. We use two complementary approaches to demonstrate that the principles driving FK lattice formation in diblock copolymers emerge directly from the strong-stretching theory of spherical domains, in which minimal inter-block area competes with minimal stretching of space-filling chains. The relative stability of FK lattices is studied first using a diblock foam model with unconstrained particle volumes and shapes, which correctly predicts not only the equilibrium σ lattice, but also the unequal volumes of the equilibrium domains. We then provide a molecular interpretation for these results via self-consistent field theory, illuminating how molecular stiffness regulates the coupling between intra-domain chain configurations and the asymmetry of local packing. These findings shed new light on the role of volume exchange in the formation of distinct FK phases in copolymers, and suggest a paradigm for the formation of FK phases in soft matter systems in which unequal domain volumes are selected by the thermodynamic competition between distinct measures of shape asymmetry.
    Comment: 40 pages, 22 figures

    The parameterized complexity of some geometric problems in unbounded dimension

    We study the parameterized complexity of the following fundamental geometric problems with respect to the dimension $d$: i) Given $n$ points in $\mathbb{R}^d$, compute their minimum enclosing cylinder. ii) Given two $n$-point sets in $\mathbb{R}^d$, decide whether they can be separated by two hyperplanes. iii) Given a system of $n$ linear inequalities with $d$ variables, find a maximum-size feasible subsystem. We show that (the decision versions of) all these problems are W[1]-hard when parameterized by the dimension $d$, and hence not solvable in $O(f(d)\,n^c)$ time for any computable function $f$ and constant $c$ (unless FPT $=$ W[1]). Our reductions also give an $n^{\Omega(d)}$-time lower bound (under the Exponential Time Hypothesis).

    Privacy and Truthful Equilibrium Selection for Aggregative Games

    We study a very general class of games, multi-dimensional aggregative games, which in particular generalize both anonymous games and weighted congestion games. For any such game that is also large, we solve the equilibrium selection problem in a strong sense. In particular, we give an efficient weak mediator: a mechanism which has only the power to listen to reported types and provide non-binding suggested actions, such that (a) it is an asymptotic Nash equilibrium for every player to truthfully report their type to the mediator and then follow its suggested action; and (b) when players do so, they end up coordinating on a particular asymptotic pure-strategy Nash equilibrium of the induced complete-information game. In fact, truthful reporting is an ex-post Nash equilibrium of the mediated game, so our solution applies even in settings of incomplete information, and even when player types are arbitrary or worst-case (i.e. not drawn from a common prior). We achieve this by giving an efficient differentially private algorithm for computing a Nash equilibrium in such games. The rates of convergence to equilibrium in all of our results are inverse polynomial in the number of players $n$. We also apply our main results to a multi-dimensional market game. Our results can be viewed as giving, for a rich class of games, a more robust version of the Revelation Principle, in that we work with weaker informational assumptions (no common prior) yet provide a stronger solution concept (ex-post Nash versus Bayes Nash equilibrium). In comparison to previous work, our main conceptual contribution is showing that weak mediators are a game-theoretic object that exists in a wide variety of games; previously, they were only known to exist in traffic routing games.

    Fast by Nature - How Stress Patterns Define Human Experience and Performance in Dexterous Tasks

    In the present study we quantify stress by measuring transient perspiratory responses in the perinasal area through thermal imaging. These responses prove to be sympathetically driven and hence a likely indicator of stress processes in the brain. Armed with the unobtrusive measurement methodology we developed, we were able to monitor stress responses in the context of surgical training, the quintessence of human dexterity. We show that in dexterous tasking under critical conditions, novices attempt to perform a task's steps as fast as experienced individuals do. We further show that while fast behavior in experienced individuals is afforded by skill, fast behavior in novices is likely instigated by high stress levels, at the expense of accuracy. Humans avoid adjusting speed to skill and instead grow their skill to a predetermined speed level, likely defined by neurophysiological latency.

    Lace: non-blocking split deque for work-stealing

    Work-stealing is an efficient method to implement load balancing in fine-grained task parallelism. Typically, concurrent deques are used for this purpose. A disadvantage of many concurrent deques is that they require expensive memory fences for local deque operations.
    In this paper, we propose a new non-blocking work-stealing deque based on the split task queue. Our design uses a dynamic split point between the shared and the private portions of the deque, and only requires memory fences when shrinking the shared portion.
    We present Lace, an implementation of work-stealing based on this deque, with an interface similar to the work-stealing library Wool, and an evaluation of Lace on several common benchmarks. We also implement a recent approach using private deques in Lace. We show that the split deque and the private deque in Lace have low overhead and high scalability similar to Wool.
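    To convey the split-point idea, here is a deliberately simplified, single-lock Python sketch (not Lace's actual non-blocking C implementation; the lock merely stands in for the atomic operations of the real deque, and all names are illustrative). The owner works on the private portion without synchronization and only synchronizes when it has to shrink the shared portion, mirroring the claim above.

import threading

class SplitDeque:
    """Simplified split work-stealing deque (illustrative only).

    The task array is divided by a split point: indices [head, split) are
    shared and may be stolen by thieves, indices [split, tail) are private
    to the owner. The owner synchronizes only when shrinking the shared
    portion."""

    def __init__(self):
        self.tasks = []                  # backing array of tasks
        self.head = 0                    # next index a thief will steal
        self.split = 0                   # boundary between shared and private parts
        self.tail = 0                    # next index the owner pushes to
        self.lock = threading.Lock()     # stand-in for atomics and fences

    def push(self, task):                # owner: no synchronization needed
        self.tasks.append(task)
        self.tail += 1

    def grow_shared(self):               # owner: donate private tasks to thieves
        self.split = self.tail           # a plain write suffices in this sketch

    def pop(self):                       # owner
        if self.tail == self.split:      # private part empty: shrink the shared part
            with self.lock:              # the only owner path that must synchronize
                self.split = max(self.head, (self.head + self.tail) // 2)
        if self.tail > self.split:       # reclaimed a private task
            self.tail -= 1
            return self.tasks.pop()
        return None                      # nothing left to reclaim from the split

    def steal(self):                     # thieves
        with self.lock:
            if self.head < self.split:
                task = self.tasks[self.head]
                self.head += 1
                return task
        return None

    In Lace itself the owner, per the abstract, only needs memory fences when shrinking the shared portion, which is what keeps the common-case push and pop operations cheap.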

    Energy saving in fixed wireless broadband networks

    In this paper, we present a mathematical formulation for saving energy in fixed broadband wireless networks by selectively turning off idle communication devices in low-demand scenarios. This problem relies on a fixed-charge capacitated network design (FCCND) problem, which is very hard to optimize. We then propose heuristic algorithms to produce feasible solutions in a short time.
    In this paper we propose an integer linear programming model for minimizing the energy consumption of microwave backhaul networks by switching off part of the equipment when traffic is low. The problem rests on a network design problem with fixed-capacity arcs, which is very hard to solve. We propose a heuristic algorithm that quickly provides feasible solutions.
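    For orientation, a generic fixed-charge capacitated network design model with on/off link variables can be written as the mixed-integer program below. This is a sketch of the problem family, not the paper's exact formulation; the symbols ($y_e$ for a link being powered, $f^k_e$ for the flow of demand $k$ on link $e$, $c_e$ for capacity, $p_e$ for power cost, $b^k_v$ for demand balances) are illustrative assumptions.

\begin{align*}
\min\quad & \sum_{e \in E} p_e\, y_e \\
\text{s.t.}\quad & \sum_{e \in \delta^+(v)} f^k_e \;-\; \sum_{e \in \delta^-(v)} f^k_e \;=\; b^k_v && \forall v \in V,\; \forall k \in K \\
& \sum_{k \in K} f^k_e \;\le\; c_e\, y_e && \forall e \in E \\
& f^k_e \ge 0, \qquad y_e \in \{0,1\} && \forall e \in E,\; \forall k \in K
\end{align*}

    The coupling constraint $\sum_{k} f^k_e \le c_e\, y_e$, which makes capacity available only on powered links, is the fixed-charge ingredient that makes such models hard to solve exactly and motivates heuristics of the kind proposed above.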